Everyone wants to be data-driven these days, so A/B testing is all the rage. Too often, however, I run into clients who run dozens of tests a month and come back with the same disheartening result: not enough repetitions to trust the answer.
This has probably happened to you. You have a critical page on your website with a high exit rate. It sits close to an important conversion page, so optimizing it for clickthrough to that conversion would clearly be valuable. So you run an A/B test. You have a few ideas for making things better, but after a couple of months, you still aren’t sure you have enough repetitions to believe the test.
How can you know?
One way is to go for statistical significance. That makes sense, but it isn’t always easy when you have low page view rates. How many repetitions are enough? It usually depends on how far apart the A and B results are: the bigger the difference, the fewer repetitions you need. But that still leaves the question unanswered.
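As a rough illustration, here is one way to estimate the required sample size per variation before you start. The 5% baseline clickthrough rate and the hoped-for lift to 6% are placeholder numbers, not figures from any real page; the significance level and power are the conventional 0.05 and 0.80.

```python
# Rough sample-size estimate for a two-proportion A/B test.
# Baseline and target rates are illustrative placeholders only.
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

baseline_rate = 0.05   # assumed current clickthrough rate
target_rate = 0.06     # the lift you hope the B variation delivers

effect = proportion_effectsize(baseline_rate, target_rate)  # Cohen's h
n_per_variation = NormalIndPower().solve_power(
    effect_size=effect, alpha=0.05, power=0.8, alternative="two-sided"
)
print(f"Roughly {n_per_variation:,.0f} visitors needed per variation")
```

Notice how quickly the number grows as the expected difference shrinks: halving the gap between the two rates roughly quadruples the visitors you need, which is why small improvements on low-traffic pages take so long to confirm.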
Another way to think about this problem is not with an A/B test, but with an A/A test. There will be some random noise in any test. One way to gauge how many repetitions you might need for statistical significance is to run an A/A test: use your A/B testing tool to run the exact same experience against itself. Once you get enough repetitions, the tool should tell you that there is no winner, and the two results should get closer and closer together as more repetitions occur.
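Here is a minimal simulation of that idea, assuming a made-up 5% conversion rate that is identical in both arms:

```python
# Simulate an A/A test: both arms share the same true conversion rate,
# yet small samples can still show a sizable (and meaningless) gap.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
true_rate = 0.05  # assumed conversion rate, identical in both arms

for n in (500, 5_000, 50_000):  # visitors per arm
    a = rng.binomial(n, true_rate)
    b = rng.binomial(n, true_rate)
    p_a, p_b = a / n, b / n
    pooled = (a + b) / (2 * n)
    se = np.sqrt(pooled * (1 - pooled) * (2 / n))
    z = (p_a - p_b) / se
    p_value = 2 * norm.sf(abs(z))
    print(f"n={n:>6}: A={p_a:.2%}  B={p_b:.2%}  "
          f"gap={abs(p_a - p_b):.2%}  p={p_value:.2f}")
```

With a few hundred visitors per arm, two identical experiences can differ by a point or more of conversion rate purely by chance; as the repetitions climb, the gap narrows toward zero, which is exactly the convergence described above.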
You might be disappointed to find out that it will take months to declare a winner on some of your low-trafficked pages. If you have high-value conversions, you might want an answer sooner. Enter machine learning. Trained machine learning tools can predict the conversion rate, bounce rate, and other statistics before your page is even published. Doing so allows you to make changes that at least get the page up to “average” before the hard work of A/B testing starts. If you have high-value pages on a low-trafficked website, you will be glad you did.
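A sketch of what that could look like, with every file name, column, and feature invented purely for illustration: train a regression model on pages you have already published, then score a draft page before it goes live.

```python
# Hypothetical sketch: predict a draft page's conversion rate from
# simple page features, using historical pages as training data.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# One row per existing page; column names are illustrative assumptions.
pages = pd.read_csv("page_history.csv")
features = ["word_count", "num_ctas", "load_time_ms", "num_images"]
X, y = pages[features], pages["conversion_rate"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
print("Holdout MAE:", mean_absolute_error(y_test, model.predict(X_test)))

# Score a draft page before publishing it (feature values invented).
draft = pd.DataFrame([{"word_count": 850, "num_ctas": 2,
                       "load_time_ms": 1200, "num_images": 3}])
print("Predicted conversion rate:", model.predict(draft)[0])
```

The point of a model like this is not to replace the A/B test; it is to catch a clearly below-average draft before you spend months of scarce traffic confirming what the prediction could have told you up front.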